Alternating series

In mathematics, an alternating series is an infinite series of the form

\sum_{n=0}^\infty (-1)^n\,a_n,

with a_n ≥ 0 (or a_n ≤ 0) for all n. Like any series, an alternating series converges if and only if the associated sequence of partial sums converges.

Alternating series test

The theorem known as the "Leibniz test" or the alternating series test states that an alternating series converges if the terms a_n converge to 0 monotonically.
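
For example, the alternating series \sum_{n=1}^\infty \frac{(-1)^{n+1}}{\sqrt{n}} converges by this test, since the terms \frac{1}{\sqrt{n}} decrease monotonically to 0.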

Proof: Suppose the sequence a_n converges to zero and is monotone decreasing. If m is odd and m < n, we obtain the estimate |S_m - S_n| ≤ a_{m+1} via the following calculation:


\begin{align}
|S_m - S_n| & =
\left|\sum_{k=0}^m(-1)^k\,a_k\,-\,\sum_{k=0}^n\,(-1)^k\,a_k\right| = \left|\sum_{k=m+1}^n\,(-1)^k\,a_k\right| \\
& = a_{m+1}-a_{m+2}+a_{m+3}-a_{m+4}+\cdots+a_n\\
& = a_{m+1}-(a_{m+2}-a_{m+3}) - (a_{m+4}-a_{m+5}) -\cdots-a_n \le a_{m+1}
\end{align}

Since a_n is monotonically decreasing, each parenthesized difference a_{m+2}-a_{m+3}, a_{m+4}-a_{m+5}, ... is non-negative, so subtracting them can only decrease the sum. This gives the final inequality |S_m - S_n| ≤ a_{m+1}. Since a_n converges to 0, the partial sums S_m form a Cauchy sequence (i.e. the series satisfies the Cauchy convergence criterion for series) and therefore converge. The argument for m even is similar.

Approximating sums

The estimate above does not depend on n. So, if a_n converges to 0 monotonically, the estimate provides an error bound for approximating the infinite sum by a partial sum:

\left|\sum_{k=0}^\infty(-1)^k\,a_k\,-\,\sum_{k=0}^m\,(-1)^k\,a_k\right| \le a_{m+1}.
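
A minimal numerical sketch of this error bound, using the alternating harmonic series \sum_{k=1}^\infty \frac{(-1)^{k+1}}{k} = \ln(2) (treated in the sections below) as a test case; with this indexing the first omitted term after the partial sum S_m is a_{m+1} = 1/(m+1):

import math

# Partial sums S_m of the alternating harmonic series, whose sum is ln 2.
def partial_sum(m):
    return sum((-1) ** (k + 1) / k for k in range(1, m + 1))

# The alternating series bound says |ln 2 - S_m| <= a_{m+1} = 1/(m+1).
for m in (10, 100, 1000):
    error = abs(math.log(2) - partial_sum(m))
    bound = 1.0 / (m + 1)
    print(m, error, bound, error <= bound)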

Absolute convergence

A series \sum a_n converges absolutely if the series \sum |a_n| converges.
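
For example, the series \sum_{n=1}^\infty \frac{(-1)^{n+1}}{n^2} converges absolutely, because \sum_{n=1}^\infty \frac{1}{n^2} converges.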

Theorem: Absolutely convergent series are convergent.

Proof: Suppose \sum a_n is absolutely convergent. Then, \sum |a_n| is convergent and it follows that \sum 2|a_n| converges as well. Since  0 \leq a_n + |a_n| \leq 2|a_n|, the series \sum (a_n + |a_n|) converges by the comparison test. Therefore, the series \sum a_n converges as the difference of two convergent series \sum a_n = \sum (a_n + |a_n|) - \sum |a_n|.

Conditional convergence

A series is conditionally convergent if it converges but does not converge absolutely.

For example, the harmonic series

\sum_{n=1}^\infty \frac{1}{n},\!

diverges, while the alternating version

\sum_{n=1}^\infty \frac{(-1)^{n+1}}{n},\!

converges by the alternating series test.

Rearrangements

For any series, we can create a new series by rearranging the order of summation. A series is unconditionally convergent if every rearrangement creates a series that converges to the same sum as the original series. Absolutely convergent series are unconditionally convergent. But the Riemann series theorem states that a conditionally convergent series can be rearranged to converge to any chosen value, or even to diverge.[1] The general principle is that addition of infinite sums is only commutative for absolutely convergent series.

For example, a well-known false proof that 1 = 0 exploits the failure of associativity for infinite sums.

As another example, we know that \ln(2) = \sum_{n=1}^\infty \frac{(-1)^{n+1}}{n} = 1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \cdots.

But, since the series does not converge absolutely, we can rearrange the terms to obtain a series for \frac{1}{2}\ln(2):


\begin{align}
& {} \quad \left(1-\frac{1}{2}\right)-\frac{1}{4}+\left(\frac{1}{3}-\frac{1}{6}\right)-\frac{1}{8}+\left(\frac{1}{5}-\frac{1}{10}\right)-\frac{1}{12}+\cdots \\[8pt]
& = \frac{1}{2}-\frac{1}{4}+\frac{1}{6}-\frac{1}{8}+\frac{1}{10}-\frac{1}{12}+\cdots \\[8pt]
& = \frac{1}{2}\left(1-\frac{1}{2}+\frac{1}{3}-\frac{1}{4}+\frac{1}{5}-\frac{1}{6}+\cdots\right)= \frac{1}{2} \ln(2)
\end{align}
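
A minimal numerical sketch of this rearrangement, summing one odd-denominator positive term followed by two even-denominator negative terms per step (the grouping below is only an illustrative check):

import math

# Rearranged alternating harmonic series:
#   1 - 1/2 - 1/4 + 1/3 - 1/6 - 1/8 + 1/5 - 1/10 - 1/12 + ...
# The k-th group of three terms is 1/(2k-1) - 1/(2(2k-1)) - 1/(4k).
total = 0.0
for k in range(1, 100001):
    total += 1.0 / (2 * k - 1) - 1.0 / (2 * (2 * k - 1)) - 1.0 / (4 * k)

# The rearranged sum approaches (1/2) ln 2, not ln 2.
print(total, 0.5 * math.log(2))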

Series acceleration

In practice, the numerical summation of an alternating series may be sped up using any one of a variety of series acceleration techniques. One of the oldest techniques is that of Euler summation, and there are many modern techniques that can offer even more rapid convergence.
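
A minimal sketch of one classical form of this idea, the Euler transformation, which rewrites \sum_{n=0}^\infty (-1)^n\,a_n as \sum_{n=0}^\infty (-1)^n\,\frac{\Delta^n a_0}{2^{n+1}}, where \Delta is the forward-difference operator (the helper below is only illustrative, not an optimized routine):

import math

def euler_transform_sum(a, n_terms):
    """Approximate sum_{k>=0} (-1)^k a[k] via the Euler transformation
    sum_{n>=0} (-1)^n (Delta^n a)_0 / 2^(n+1), Delta = forward difference."""
    diffs = list(a)            # current row of the difference table
    total, sign = 0.0, 1.0
    for n in range(n_terms):
        total += sign * diffs[0] / 2.0 ** (n + 1)
        diffs = [diffs[k + 1] - diffs[k] for k in range(len(diffs) - 1)]
        sign = -sign
    return total

# a_k = 1/(k+1), so the alternating sum is ln 2; 20 transformed terms
# already agree with ln 2 far more closely than 20 direct partial-sum terms.
a = [1.0 / (k + 1) for k in range(40)]
print(euler_transform_sum(a, 20), math.log(2))

In floating point, repeated differencing eventually loses accuracy through cancellation, so the number of transformed terms is kept modest in this sketch.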
